Variable Selection with Akaike Information Criteria: A Comparative Study

Authors

  • Meral Candan Çetin
  • Aydin Erar
Abstract

In this paper, the problem of variable selection in linear regression is considered. This problem involves choosing the most appropriate model from the candidate models. Variable selection criteria based on estimates of the Kullback-Leibler information are the most common; Akaike's AIC and the bias-corrected AIC (AICc) belong to this group of criteria. Reducing the bias in estimating the Kullback-Leibler information can lead to better variable selection. In this study, we compare the Akaike criterion based on the Fisher information with the AIC criteria based on the Kullback-Leibler information.
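
As a rough illustration of the criteria mentioned in the abstract (not the authors' code), the sketch below computes the Gaussian-likelihood AIC and the bias-corrected AICc for candidate linear regression models. The parameter count k = p + 1 (coefficients plus the error variance) and the exhaustive subset search are assumptions of this sketch.

    import numpy as np
    from itertools import combinations

    def aic_aicc(y, X):
        # Gaussian-likelihood AIC and bias-corrected AICc for one candidate model;
        # additive constants that are identical across models are dropped.
        n, p = X.shape
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        k = p + 1                                  # coefficients + error variance
        aic = n * np.log(rss / n) + 2 * k
        aicc = aic + 2 * k * (k + 1) / (n - k - 1)
        return aic, aicc

    def best_subset(y, X_full):
        # Exhaustive search over predictor subsets; feasible only for small p.
        _, p = X_full.shape
        best = None
        for r in range(1, p + 1):
            for cols in combinations(range(p), r):
                _, aicc = aic_aicc(y, X_full[:, list(cols)])
                if best is None or aicc < best[0]:
                    best = (aicc, cols)
        return best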


Similar resources

An Akaike information criterion for multiple event mixture cure models

We derive the proper form of the Akaike information criterion for variable selection for mixture cure models, which are often fit via the expectation-maximization algorithm. Separate covariate sets may be used in the mixture components. The selection criteria are applicable to survival models for right-censored data with multiple competing risks and allow for the presence of an insusceptible gr...


Boosting Algorithms: Regularization, Prediction and Model Fitting

We present a statistical perspective on boosting. Special emphasis is given to estimating potentially complex parametric or nonparametric models, including generalized linear and additive models as well as regression models for survival analysis. Concepts of degrees of freedom and corresponding Akaike or Bayesian information criteria, particularly useful for regularization and variable selectio...


Empirical Bayes vs. Fully Bayes Variable Selection

For the problem of variable selection for the normal linear model, fixed penalty selection criteria such as AIC, Cp, BIC and RIC correspond to the posterior modes of a hierarchical Bayes model for various fixed hyperparameter settings. Adaptive selection criteria obtained by empirical Bayes estimation of the hyperparameters have been shown by George and Foster [2000. Calibration and Empirical B...
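
For context on the "fixed penalty" phrasing above: these criteria can all be written as a residual sum of squares penalized by a fixed multiple of the model size. The generic form below is a sketch assumed for illustration (with sigma2 treated as known or estimated from the full model); it is not code from the cited paper.

    import numpy as np

    def penalized_rss(rss, k, sigma2, penalty):
        # Generic fixed-penalty criterion: RSS / sigma^2 + penalty * k (smaller is better).
        return rss / sigma2 + penalty * k

    def classical_penalties(n, p):
        # Penalty values that reproduce the classical criteria for a pool of
        # p candidate predictors and n observations.
        return {"AIC/Cp": 2.0, "BIC": np.log(n), "RIC": 2.0 * np.log(p)}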


Bayesian model selection in ARFIMA models

Keywords: Bayesian model selection; Reversible jump Markov chain Monte Carlo; Autoregressive fractional integrated moving average models; Long memory processes. Abstract: Various model selection criteria such as the Akaike information criterion (AIC; Akaike, 1973), Bayesian information criterion (BIC; Akaike, 1979) and Hannan–Quinn criterion (HQC; Hannan, 1980) are used for model specification in...
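
As a reference point for the criteria named above, the usual penalized-likelihood forms are easy to state; this minimal sketch assumes the maximized log-likelihood, the parameter count and the sample size are already available, and it is not tied to the ARFIMA setting of the cited paper.

    import numpy as np

    def information_criteria(loglik, k, n):
        # Standard penalized-likelihood criteria; smaller values indicate a better model.
        aic = -2.0 * loglik + 2.0 * k
        bic = -2.0 * loglik + k * np.log(n)
        hqc = -2.0 * loglik + 2.0 * k * np.log(np.log(n))
        return {"AIC": aic, "BIC": bic, "HQC": hqc}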


On Measures of Information and Divergence and Model Selection Criteria

In this paper we discuss measures of information and divergence and model selection criteria. Three classes of measures, Fisher-type, divergence-type and entropy-type measures, are discussed and their properties are presented. Information through censoring and truncation is presented and model selection criteria are investigated, including the Akaike Information Criterion (AIC) and the Diverge...



Publication date: 2006